Augmentation, not Automation (AI in Mental Healthcare)

In many ways, the pandemic has spotlighted mental health and the need for faster access to support services. Not only has COVID-19 directly increased the number of adults with mental health issues (alarmingly described as a ‘tsunami of mental illness’ by the Royal College of Psychiatrists), but, as in other health sectors, fewer patients sought support during lockdown.

As a result, there are swathes of new patients flowing into the system, on top of an existing backlog of people who require care. Put simply, mental health services are already playing catch up.

Patient referrals into the NHS Improving Access to Psychological Therapies (IAPT) programme – talk therapy services for adults – are expected to triple in the aftermath of COVID-19. However, while demand for talk therapy is at an all-time high, service capacity is unlikely to increase anytime soon.

Digital health – specifically artificial intelligence (AI) – is well-positioned to alleviate the pressure on our healthcare system, providing mental health support to those who need it and improving equality of service access. 

However, psychological therapy is, at its core, a deeply human discipline. Automating every element of care is not an appropriate solution. Instead, AI should focus on empowering clinicians with the tools to support patients at scale, not on replacing human therapists altogether.

For example, a particular pinch point in IAPT services is the clinical assessment process. According to the IAPT 2018 Annual Report, approximately 25% of the total IAPT budget is taken up by clinical assessments. These assessments are a necessary part of the care pathway – they screen the patient for service eligibility and point them to appropriate treatment options – but they are also notoriously labour-intensive, both clinically and administratively.

Elements of the assessment process can be streamlined through automation (though not all parts – for example, dealing with high-risk patients). By automating components of the clinical assessments process, it is possible to reduce the average three-week wait time that patients face for a clinical assessment, while also freeing up clinical resources.
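To make that division of labour concrete, here is a minimal sketch of what such a triage step might look like. It is purely illustrative and not Limbic’s actual logic: it assumes routine scoring of a standard questionnaire (the PHQ-9, commonly used in IAPT services) can be automated, while any risk signal is escalated unconditionally to a human clinician. The thresholds, routing names and function names are assumptions for illustration.

```python
# Hypothetical triage step: automate routine screening,
# but always escalate high-risk responses to a human clinician.
from dataclasses import dataclass

# PHQ-9 is a standard 9-item depression questionnaire (each item scored 0-3);
# item 9 asks about thoughts of self-harm.
RISK_ITEM_INDEX = 8  # zero-based index of PHQ-9 item 9

@dataclass
class TriageOutcome:
    route: str    # "human_review" or "automated_pathway"
    reason: str

def triage(phq9_scores: list[int]) -> TriageOutcome:
    """Route a referral based on PHQ-9 answers."""
    if len(phq9_scores) != 9 or not all(0 <= s <= 3 for s in phq9_scores):
        return TriageOutcome("human_review", "incomplete or invalid answers")
    # Any endorsement of the self-harm item bypasses automation entirely.
    if phq9_scores[RISK_ITEM_INDEX] > 0:
        return TriageOutcome("human_review", "risk item endorsed")
    total = sum(phq9_scores)
    if total >= 20:  # illustrative cut-off for severe symptoms
        return TriageOutcome("human_review", f"severe symptoms (score {total})")
    return TriageOutcome("automated_pathway", f"total score {total}")

print(triage([1, 0, 2, 1, 0, 1, 0, 1, 0]))  # -> automated_pathway
print(triage([1, 0, 2, 1, 0, 1, 0, 1, 1]))  # -> human_review (risk item)
```

In a real deployment, the questionnaires, thresholds and escalation rules would be set by clinical leads. The shape of the logic is the point: automate the routine, escalate the risky.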

In fact, pilot data from our own Limbic Access triage software – which augments clinical assessments within IAPT services – showed that, after just one month across four IAPT services, automating elements of the clinical assessment process saved clinicians 20 minutes per referral and, in aggregate, 430 weeks of patient wait time and 86 hours of clinical time.

Perhaps most importantly, the resulting efficiency meant that patients bypassed the average 22-day wait for a human-led assessment and entered the most appropriate care pathway faster. As well as improving the patient experience, faster time to treatment is known to improve recovery rates.

The key to the success of our own solution was incorporating human elements into the software. The tool itself is a conversational AI developed in collaboration with IAPT clinical leads. It is specifically designed to mimic human conversation and, in doing so, encourage high engagement and reassure patients in the same way that talking to someone in real life can. This approach has been particularly effective at putting patients at ease – crucial for people accessing mental health support services. In fact, 91.64% of patients said the tool helped them access care.
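As a rough illustration of what “mimicking human conversation” means structurally, the hypothetical sketch below wraps screening questions in empathetic acknowledgements, so each answer gets a human-sounding response rather than a bare form field. The question and acknowledgement texts are invented, and this is not Limbic’s actual dialogue engine.

```python
# Hypothetical conversational wrapper around screening questions:
# acknowledge each answer before asking the next.
QUESTIONS = [
    "Over the last two weeks, how often have you felt down or hopeless?",
    "How often have you had trouble sleeping?",
]
ACKNOWLEDGEMENTS = {
    0: "I'm glad that hasn't been troubling you.",
    1: "Thanks for sharing that.",
    2: "That sounds difficult - thank you for telling me.",
    3: "I'm sorry you've been going through that.",
}

def run_screening(answer_fn) -> list[int]:
    """Ask each question, echo an empathetic acknowledgement, collect scores."""
    scores = []
    for question in QUESTIONS:
        print(f"Bot: {question} (0 = not at all ... 3 = nearly every day)")
        score = answer_fn(question)  # e.g. read from a chat UI
        print(f"Bot: {ACKNOWLEDGEMENTS[score]}")
        scores.append(score)
    return scores

# Simulated patient who answers 2 to everything, for demonstration:
print(run_screening(lambda question: 2))
```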

Conversational AI also provided our platform with additional value that we didn’t anticipate. Anecdotally, patients commented that the support felt personalised and that they appreciated the anonymity of accessing it digitally. For a process like clinical assessment, which is typically tedious and lengthy, this meant the Limbic Access software made the overall experience more engaging and understandable for the patient.

Furthermore, the platform responds to users in real time, 24/7. In fact, more than two-fifths (42.1%) of referrals occurred outside regular working hours, further reinforcing the argument that technology can speed up patient access to support, enable triage at scale and drive equality of service access. In turn, the data generated has helped our team refine and develop the product.

Now, as with all new technologies, there are challenges we must consider. One particular example is the ethical implications of processing highly personal and sensitive mental health information, and the corresponding regulatory requirements.

This was something we spent a lot of time on.

As Limbic aims to revolutionise talk therapy through data science, data privacy is a key part of our organisational culture. We advocate for regulation in this space, and so set about achieving accreditations to ensure our own protocols are of the highest standard.

Finally, clinical assessments are not the only area of mental healthcare that could benefit from AI. But once again, we need to remember that AI adoption is not about replacing humans or removing them from the equation. Rather, AI and technology promise to augment the power of clinicians, giving them better data to make more informed decisions. This is the true value of AI in mental health.
